Performance Analysis of the Gradient Comparator LMS Algorithm
Authors
Abstract
The sparsity-aware zero-attractor least mean square (ZA-LMS) algorithm exhibits much lower misadjustment in strongly sparse environments than its sparsity-agnostic counterpart, the least mean square (LMS) algorithm, but it is shown to perform worse than the LMS as the sparsity of the impulse response decreases. The reweighted variant of the ZA-LMS, the RZA-LMS, is robust against this variation in sparsity, but at the price of increased computational complexity. Other variants such as the l0-LMS and the improved proportionate normalized LMS (IPNLMS), though they perform satisfactorily, are also computationally intensive. The gradient comparator LMS (GC-LMS) is a practical solution to this trade-off when hardware constraints must be considered. In this paper, we analyse the mean and mean-square convergence performance of the GC-LMS algorithm in detail. The analyses agree well with the simulation results.
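The selective zero attraction that distinguishes the GC-LMS from the ZA-LMS can be sketched as follows. This is a minimal NumPy illustration, not the paper's reference implementation: the step size `mu`, the attractor strength `rho`, and all variable names are chosen here for exposition, and the comparator rule (apply the attractor only to taps whose sign matches the sign of the instantaneous gradient estimate) follows the description in the abstracts.

```python
import numpy as np

def gc_lms(x, d, num_taps, mu=0.01, rho=1e-4):
    """Sketch of the gradient comparator LMS (GC-LMS) for sparse system ID.

    The plain LMS gradient step is always taken; the zero-attractor term
    -rho * sgn(w_i) is applied only to those taps whose sign agrees with
    the sign of the instantaneous gradient estimate e(n) * x(n - i)
    (the "comparator"). Hyper-parameter values are illustrative only.
    """
    w = np.zeros(num_taps)
    e = np.zeros(len(x))
    for n in range(num_taps - 1, len(x)):
        u = x[n - num_taps + 1 : n + 1][::-1]   # u[i] = x[n - i]
        e[n] = d[n] - w @ u                     # instantaneous error
        grad = e[n] * u                         # instantaneous gradient estimate
        agree = np.sign(grad) == np.sign(w)     # comparator: matching polarity?
        w = w + mu * grad - rho * agree * np.sign(w)
    return w, e
```

When the signs disagree everywhere, the update reduces to plain LMS; when they agree everywhere, it reduces to the ZA-LMS update, which is what makes the scheme a cheap compromise between the two.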
Similar Resources
Gradient Compared Lp-LMS Algorithms for Sparse System Identification
In this paper, we propose two novel p-norm penalty least mean square (lp-LMS) algorithms as supplements to the conventional lp-LMS algorithm recently established for sparse adaptive filtering. A gradient comparator is employed to selectively apply the zero attractor of the p-norm constraint only to those taps that have the same polarity as the gradient of the squared instantaneous error, w...
Digital LMS Adaptation of Analog Filters Without Gradient Information
The least mean square (LMS) algorithm has practical problems in the analog domain, mainly due to DC offset effects. If digital LMS adaptation is used, a digitizer (analog-to-digital converter or comparator) is required for each gradient signal as well as for the filter output. Furthermore, in some cases the state signals are not available anywhere in the analog signal path, necessitating additional a...
Tracking performance of incremental LMS algorithm over adaptive distributed sensor networks
In this paper we focus on the tracking performance of the incremental adaptive LMS algorithm in an adaptive network. To this end, we consider the unknown weight vector to be a time-varying sequence. First we analyze the performance of the network in tracking a time-varying weight vector, and then we explain the estimation of a Rayleigh fading channel through a random-walk model. Closed-form relations a...
An Analytical Model for Predicting the Convergence Behavior of the Least Mean Mixed-Norm (LMMN) Algorithm
The Least Mean Mixed-Norm (LMMN) algorithm is a stochastic gradient-based algorithm whose objective is to minimize a combination of the cost functions of the Least Mean Square (LMS) and Least Mean Fourth (LMF) algorithms. This algorithm has inherited many properties and advantages of the LMS and LMF algorithms and mitigated their weaknesses in some ways. The main issue of the LMMN algorithm is t...
Performance characteristics of the median LMS adaptive filter
The performance of gradient search adaptive filters, such as the least mean squares (LMS) algorithm, may degrade badly when the filter is subjected to input signals which are corrupted by impulsive interference. The median LMS (MLMS) adaptive filter is designed to alleviate this problem by protecting the filter coefficients from the impact of the impulses. MLMS is a modification of LMS, obtaine...
Journal: CoRR
Volume: abs/1605.02877
Pages: -
Published: 2016